In mathematical logic and theoretical computer science, an abstract rewriting system (also (abstract) reduction system or abstract rewrite system; abbreviation ARS) is a formalism that captures the quintessential notion and properties of rewriting systems. In its simplest form, an ARS is simply a set (of "objects") together with a binary relation, traditionally denoted with →; this definition can be further refined if we index (label) subsets of the binary relation. Despite its simplicity, an ARS is sufficient to describe important properties of rewriting systems like normal forms, termination, and various notions of confluence.
Historically, there have been several formalizations of rewriting in an abstract setting, each with its idiosyncrasies. This is due in part to the fact that some notions are equivalent, as we shall see in this article. The formalization that is most commonly encountered in monographs and textbooks, and which we generally follow here, is due to Gérard Huet (1980).[1]
We need to specify a set of objects and the rules that can be applied to transform them. The most general (unidimensional) setting of this notion is called an abstract reduction system, (abbreviated ARS), although more recently authors use abstract rewriting system as well.[2] (The preference for the word "reduction" here instead of "rewriting" constitutes a departure from the uniform use of "rewriting" in the names of systems that are particularizations of ARS. Because the word "reduction" does not appear in the names of more specialized systems, in older texts reduction system is a synonym for ARS).[3]
An ARS is simply a set A, whose elements are usually called objects, together with a binary relation on A, traditionally denoted by →, and called the reduction relation, rewrite relation[4] or just reduction.[5] This (entrenched) terminology using "reduction" is a little misleading, because the relation is not necessarily reducing some measure of the objects; this will become more apparent when we discuss string rewriting systems further in this article.
In some contexts it may be beneficial to distinguish between some subsets of the rules, i.e. some subsets of the reduction relation →, e.g. the entire reduction relation may consist of associativity and commutativity rules. Consequently, some authors define the reduction relation → as the indexed union of some relations; for instance if →1 ∪ →2 = →, the notation used is (A, →1, →2).
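For illustration, a small finite ARS with an indexed reduction relation can be encoded directly as plain data. The following Python sketch is only one possible encoding; the objects and labels are made up for this example and are not drawn from any standard library.

```python
# A minimal encoding of a finite ARS (A, ->1, ->2) as plain Python data.
# The concrete objects and labels here are illustrative only.

A = {"x", "y", "z"}                     # the set of objects

# Two indexed sub-relations, each a set of (source, target) pairs.
rel = {
    1: {("x", "y")},                    # ->1
    2: {("y", "z")},                    # ->2
}

# The full reduction relation -> is the union of the indexed parts.
reduction = set().union(*rel.values())

print(sorted(reduction))                # [('x', 'y'), ('y', 'z')]
```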
As a mathematical object, an ARS is exactly the same as an unlabeled state transition system, and if we consider the relation as an indexed union, then an ARS is the same as a labeled state transition system with the indices being the labels. The focus of the study and the terminology are different, however. In a state transition system one is interested in interpreting the labels as actions, whereas in an ARS the focus is on how objects may be transformed (rewritten) into others.[6]
Suppose the set of objects is T = {a, b, c} and the binary relation is given by the rules a → b, b → a, a → c, and b → c. Observe that these rules can be applied to both a and b in any fashion to get c. Such a property is clearly an important one. Note also, that c is, in a sense, a "simplest" object in the system, since nothing can be applied to c to transform it any further.
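The system of Example 1 is small enough to encode and query directly. The sketch below (a toy encoding, not part of any library) lists the one-step successors of each object, confirming that c admits no further rewriting.

```python
# Example 1 as a finite ARS: objects {a, b, c} with rules
# a -> b, b -> a, a -> c, b -> c.
rules = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c")}

def successors(x):
    """Objects reachable from x in exactly one rewrite step."""
    return {t for (s, t) in rules if s == x}

print(successors("a"))   # both b and c: a can be rewritten to either
print(successors("c"))   # empty set: no rule applies to c
```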
Example 1 leads us to define some important notions in the general setting of an ARS. First we need some basic notions and notations.[7] →* denotes the reflexive transitive closure of →, that is, x →* y holds if x = y or x can be rewritten to y in finitely many steps; ↔ denotes the symmetric closure of →; and ↔* denotes the reflexive transitive symmetric closure of →, i.e. the smallest equivalence relation containing →.
An object x in A is called reducible if there exists some y in A such that x → y; otherwise it is called irreducible or a normal form. An object y is called a normal form of x if x →* y and y is irreducible. If x has a unique normal form, then this is usually denoted with x↓. In example 1 above, c is a normal form, and c = a↓ = b↓. If every object has at least one normal form, the ARS is called normalizing.
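For a finite ARS such as Example 1, the normal forms of an object can be found by exhaustively computing →* and keeping the irreducible results. The following sketch does exactly that; the helper names are ad hoc, chosen only for this illustration.

```python
# Computing the normal forms of each object in Example 1 by brute force.
rules = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c")}
objects = {"a", "b", "c"}

def reachable(x):
    """All y with x ->* y (reflexive transitive closure), by a simple search."""
    seen, stack = {x}, [x]
    while stack:
        s = stack.pop()
        for (u, v) in rules:
            if u == s and v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

def is_irreducible(x):
    """x is irreducible if no rule applies to it."""
    return not any(u == x for (u, v) in rules)

def normal_forms(x):
    """Irreducible objects reachable from x."""
    return {y for y in reachable(x) if is_irreducible(y)}

for x in sorted(objects):
    print(x, normal_forms(x))   # every object has the single normal form c
```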
One of the important problems that may be formulated in an ARS is the word problem: given x and y, are they equivalent under ↔*? This is a very general setting for formulating the word problem for the presentation of an algebraic structure. For instance, the word problem for groups is a particular case of an ARS word problem. Central to an "easy" solution for the word problem is the existence of unique normal forms: in this case if two objects have the same normal form, then they are equivalent under ↔*. The word problem for an ARS is undecidable in general.
A related, but weaker notion than the existence of normal forms is that of two objects being joinable: x and y are said to be joinable if there exists some z with the property that x →* z ←* y. From this definition, it is apparent that one may define the joinability relation as →* ∘ ←*, where ∘ is the composition of relations. Joinability is usually denoted, somewhat confusingly, also with ↓, but in this notation the down arrow is a binary relation, i.e. we write x↓y if x and y are joinable.
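On a finite system, joinability can be tested by intersecting the sets of →*-successors of the two objects. The sketch below illustrates the definition on Example 1; as before, the encoding and names are purely illustrative.

```python
# Joinability in a finite ARS: x and y are joinable if some z satisfies
# x ->* z and y ->* z.
rules = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c")}

def reachable(x):
    """All y with x ->* y, by a simple search."""
    seen, stack = {x}, [x]
    while stack:
        s = stack.pop()
        for (u, v) in rules:
            if u == s and v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

def joinable(x, y):
    """x and y are joinable iff their ->* successor sets intersect."""
    return bool(reachable(x) & reachable(y))

print(joinable("a", "b"))   # True: both reduce to the common successor c
```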
An ARS is said to possess the Church-Rosser property if and only if x ↔* y implies x↓y for all objects x, y. Equivalently, the Church-Rosser property means that the reflexive transitive symmetric closure ↔* is contained in the joinability relation. Alonzo Church and J. Barkley Rosser proved in 1936 that lambda calculus has this property;[8] hence the name of the property.[9] (The fact that lambda calculus has this property is also known as the Church-Rosser theorem.) In an ARS with the Church-Rosser property the word problem may be reduced to the search for a common successor. In a Church-Rosser system, an object has at most one normal form; that is, the normal form of an object is unique if it exists, but it may well not exist. In lambda calculus for instance, the expression (λx.xx)(λx.xx) does not have a normal form because there exists an infinite sequence of beta reductions (λx.xx)(λx.xx) → (λx.xx)(λx.xx) → ...[10]
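For a finite ARS, the Church-Rosser property can be checked by brute force: compute ↔* as reachability in the symmetric closure and verify that every convertible pair is joinable. The following sketch does this for Example 1; the encoding is illustrative only.

```python
# Brute-force check of the Church-Rosser property on Example 1:
# every pair related by <->* must be joinable.
rules = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c")}
objects = {"a", "b", "c"}

def reachable(x, rel):
    """Reflexive transitive closure of the relation rel, starting from x."""
    seen, stack = {x}, [x]
    while stack:
        s = stack.pop()
        for (u, v) in rel:
            if u == s and v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

symmetric = rules | {(v, u) for (u, v) in rules}      # <-> (symmetric closure)

def convertible(x, y):
    """x <->* y: reachability in the symmetric closure."""
    return y in reachable(x, symmetric)

def joinable(x, y):
    return bool(reachable(x, rules) & reachable(y, rules))

church_rosser = all(joinable(x, y)
                    for x in objects for y in objects
                    if convertible(x, y))
print(church_rosser)   # True for Example 1
```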
Various properties, simpler than Church-Rosser, are equivalent to it. The existence of these equivalent properties allows one to prove that a system is Church-Rosser with less work. Furthermore, the notions of confluence can be defined as properties of a particular object, something that's not possible for Church-Rosser. An ARS is said to be:
- confluent if and only if for all w, x, and y in A, x ←* w →* y implies x ↓ y. Roughly speaking, confluence says that no matter how two rewriting paths diverge from a common ancestor w, they can always be brought back together at some common successor.
- semi-confluent if and only if for all w, x, and y in A, x ← w →* y implies x ↓ y. This differs from confluence only in requiring a single reduction step from w to x.
- locally confluent if and only if for all w, x, and y in A, x ← w → y implies x ↓ y. This property is sometimes called weak confluence.
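The three notions above can likewise be checked by brute force on a finite system. The following sketch tests each definition directly on Example 1; the helper names are illustrative, not drawn from any library.

```python
# Brute-force checks of confluence, semi-confluence and local confluence
# on the finite ARS of Example 1.
rules = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c")}
objects = {"a", "b", "c"}

def step(x):
    """One-step successors of x."""
    return {v for (u, v) in rules if u == x}

def reach(x):
    """->* : reflexive transitive closure, by a simple search."""
    seen, stack = {x}, [x]
    while stack:
        s = stack.pop()
        for v in step(s) - seen:
            seen.add(v)
            stack.append(v)
    return seen

def joinable(x, y):
    return bool(reach(x) & reach(y))

confluent = all(joinable(x, y)
                for w in objects for x in reach(w) for y in reach(w))

semi_confluent = all(joinable(x, y)
                     for w in objects for x in step(w) for y in reach(w))

locally_confluent = all(joinable(x, y)
                        for w in objects for x in step(w) for y in step(w))

print(confluent, semi_confluent, locally_confluent)   # all True for Example 1
```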
Theorem. For an ARS the following three conditions are equivalent: (i) it has the Church-Rosser property, (ii) it is confluent, (iii) it is semi-confluent.[11]
Corollary.[12] In a confluent ARS if x ↔* y then: if both x and y are normal forms, then x = y; and if y is a normal form, then x →* y.
Because of these equivalences, a fair bit of variation in definitions is encountered in the literature. For instance, in Terese the Church-Rosser property and confluence are defined to be synonymous and identical to the definition of confluence presented here; Church-Rosser as defined here remains unnamed, but is given as an equivalent property; this departure from other texts is deliberate.[13] Because of the above corollary, one may define a normal form y of x as an irreducible y with the property that x ↔* y. This definition, found in Book and Otto, is equivalent to the common one given here in a confluent system, but it is more inclusive in a non-confluent ARS.
Local confluence on the other hand is not equivalent with the other notions of confluence given in this section, but it is strictly weaker than confluence. The typical counterexample is the system with rules a → b, b → a, a → c, b → d, which is locally confluent but not confluent: a reduces to the two distinct normal forms c and d, which are not joinable.
An abstract rewriting system is said to be terminating or noetherian if there is no infinite chain x0 → x1 → x2 → ⋯. In a terminating ARS, every object has at least one normal form, thus it is normalizing. The converse is not true. In example 1 for instance, there is an infinite rewriting chain, namely a → b → a → b → ⋯, even though the system is normalizing. A confluent and terminating ARS is called convergent. In a convergent ARS, every object has a unique normal form. But it is sufficient for the system to be confluent and normalizing for a unique normal form to exist for every element, as seen in example 1.
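For a finite ARS, termination amounts to the absence of reachable cycles in the reduction relation, so it too can be checked mechanically. The sketch below (an illustrative encoding of Example 1) detects the cycle a → b → a and concludes that the system is not terminating, even though it is normalizing.

```python
# Termination check for a finite ARS: an infinite chain exists in a finite
# system exactly when some object can reach itself in one or more steps.
rules = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c")}
objects = {"a", "b", "c"}

def step(x):
    return {v for (u, v) in rules if u == x}

def reach_plus(x):
    """->+ : objects reachable from x in at least one step."""
    seen, stack = set(), list(step(x))
    while stack:
        s = stack.pop()
        if s not in seen:
            seen.add(s)
            stack.extend(step(s))
    return seen

terminating = not any(x in reach_plus(x) for x in objects)
print(terminating)   # False: a -> b -> a -> b -> ... is an infinite chain
```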
Theorem (Newman's Lemma): A terminating ARS is confluent if and only if it is locally confluent.
The original 1942 proof of this result by Newman was rather complicated. It wasn't until 1980 that Huet published a much simpler proof exploiting the fact that when → is terminating we can apply well-founded induction.[14]